A METHOD OF INTERACTION BETWEEN A DEVICE AND AN INFORMATION CARRIER WITH TRANSPARENT ZONE(S)
Patent abstract:
A computer-implemented method of interacting with a device (110; 210) with a touch screen (111; 211), comprising: - detecting the presence of an information carrier (101; 201) in superposition on the screen (111; 211), wherein the information carrier (101; 201) has at least one transparent zone (102; 202); - determining the location of the information carrier (101; 201) on the screen (111; 211); and - modifying at least one portion of an image (112A, 112B; 212B, 213B) displayed on a portion of the screen surface (111; 211) covered by the at least one transparent zone (102; 202) of the information carrier (101; 201). The at least one portion of the image (112A, 112B; 212B, 213B) displayed behind the at least one transparent zone (102; 202) of the information carrier (101; 201) forms, together with an image (204) printed on the information carrier, a combined image.
Publication number: BE1022308B1
Application number: E2014/0119
Filing date: 2014-02-20
Publication date: 2016-03-15
Inventor: Steven Karel Maria Nietvelt
Applicant: Cartamundi Turnhout Nv
IPC main class:
Patent description:
A METHOD OF INTERACTION BETWEEN A DEVICE AND AN INFORMATION CARRIER WITH TRANSPARENT ZONE(S)

Field of the Invention

The present invention generally relates to a device with a screen and an information carrier, e.g. a card of paper, cardboard or plastic on which information such as text and images is printed. The invention relates in particular to advanced virtual interaction between such a device with a screen and an information carrier that has transparent parts through which corresponding parts of the screen of the device can be viewed while being covered by the information carrier. The invention also proposes advanced virtual interaction between such a device and such an information carrier while the information carrier is moved across the screen of the device.

Background of the Invention

Various methods and systems that allow interaction or virtual interaction between an object, e.g. a finger, stylus or card, and a screen have already been described in the literature. US patent US 8,381,135 entitled "Proximity Detector in Handheld Device" describes, for example, detection of an object, e.g. a finger or stylus, in the vicinity of a touch screen, and enlarging a portion of the graphical user interface (Graphical User Interface or GUI) near the detected object or displaying a GUI element near the detected object. Specific embodiments described in US 8,381,135 include the representation of a virtual control element, e.g. a virtual scroll wheel as shown in FIG. 17B of US 8,381,135, or a virtual keyboard as shown in FIG. 17J of US 8,381,135, or locally magnifying the displayed content as shown in FIG. 19A/FIG. 19B of US 8,381,135. US 8,381,135, however, does not disclose any interaction or virtual interaction between a screen and an information carrier, e.g. a card. The main focus is on input from the user, i.e. a person who touches the screen with a finger or a stylus. 
US 8,381,135 does not propose to detect the presence of information carriers that have transparent portions and does not rely on the presence of such transparent portions to select which portion(s) of the displayed image will be modified. [05] US Patent Application US 2011/0201427 entitled "Electronic Game with Overlay Card" describes interaction between a game console with a touch screen and a card. The card contains a pattern that guides the user in making gestures, e.g. with a stylus, that interact with the touch screen. As a result of the interaction with the user, the card is detected and identified and a reaction is performed that affects the game. For example, the response may be to change a portion of the game that is displayed on the touch screen. [06] In US 2011/0201427 there is no interaction with only the card or information carrier. Input from the user, e.g. a user following a specific pattern with a stylus, is still required, and therefore detection and identification of the card remain subject to errors. In addition, US 2011/0201427 does not track card movements, making it impossible to establish virtual interaction between the GUI and a card that is moved across the screen. It also remains impossible to assign a virtual activity, e.g. a magnification effect, X-ray scan effect, night vision viewer, etc., to the card or information carrier. [07] The article "The metaDESK: Models and Prototypes for Tangible User Interfaces" by the authors Brygg Ullmer and Hiroshi Ishii, published in the proceedings of UIST '97, October 14-17, 1997, describes a system comprising a table, i.e. a near-horizontal back-projected graphical surface, and a passive lens with an optically transparent surface through which the table is viewed. The architecture of the system known from Ullmer and Ishii is shown in FIG. 8 of the article mentioned above. A position detection device, e.g. 
a "Flock of Birds" sensor, tracks movement of the passive lens over the table. The displayed graphic elements are updated behind the transparent surface of the passive lens. If a map of the MIT campus is displayed on the table, an orthographic aerial photograph can be displayed on the part of the table behind the transparent surface of the passive lens. In this way, the user has the advanced experience that the passive lens converts the map information into photographic information. [08] The system known from Ullmer and Ishii contains complex, heavy and expensive hardware such as a table, a passive lens with connection options to the table, and computer-controlled observation or "Flock of Birds" sensors to track movement of the passive lens. Furthermore, Ullmer and Ishii do not convert an everyday device such as a touch-enabled laptop, tablet or smartphone into a device that interacts virtually with a card or data carrier that has transparent zones. The passive lens does not form an information carrier per se, with the result that the metaDESK known from Ullmer and Ishii does not create a combined image that is the result of information printed on an information carrier and visual elements displayed on parts of the screen that are covered by transparent parts of such an information carrier. It is an object of the present invention to solve the above-mentioned shortcomings of the existing solutions. More specifically, it is an objective to describe a method for advanced interaction between a screen and an information carrier, whereby it is possible to assign virtual activity to transparent areas in the information carrier and to create a combined effect of image(s) displayed and image(s) printed on the information carrier. It is an additional object of the present invention to enable such advanced interaction between a screen and an information carrier when the information carrier is moved along the surface of the screen. 
Summary of the Invention

[10] According to the present invention, the above identified shortcomings of existing solutions are solved by the computer-implemented method for interacting with a touch screen device as defined by claim 1, the computer-implemented method comprising: - detecting the presence of an information carrier in superposition on the screen, the information carrier having at least one transparent zone; - determining the location of the information carrier on the screen; and - modifying at least one portion of an image displayed on the screen, the at least one portion of the image being displayed on a portion of the surface of the screen covered by the at least one transparent zone of the information carrier, wherein the at least one portion of the image displayed behind the at least one transparent zone of the information carrier and an image printed on the information carrier together form a combined image. [11] The present invention thus consists of realizing virtual interaction between a device with a touch screen, e.g. a desktop, laptop, tablet, mini-tablet, smartphone, mobile phone, game console, media player, etc., and an information carrier with transparent part(s), e.g. a game card, loyalty card, collector card, etc., the non-transparent part(s) of which are typically printed with information, e.g. text, images, cartoons, etc. The information carrier has one or more transparent zones in the shape of a circle, triangle, square, monocle, binoculars, lens or any other shape. First, the presence of the information carrier on or near the touch screen is detected. There are various technologies for detecting the presence of an object such as the information carrier with transparent zones: capacitive detection of conductive elements integrated in the information carrier, reading a tag (e.g. 
an RFID tag) that is integrated in the information carrier, recognition of a touch pattern performed by the user on the basis of instructions on the information carrier, etc. Then the location of the information carrier on the screen is determined. In other words, the present invention is aware of the location of the information carrier, e.g. a card, on the touch screen of the device. At least the part(s) of the screen that is (are) covered by the transparent part(s) of the information carrier is (are) subsequently modified to establish a virtual interaction between the screen and the information carrier. To this end, knowledge about the location of the transparent part(s) in the information carrier must be available: this knowledge may be predefined, in other words the computer-implemented method is aware of it because all information carriers have the same structure with transparent part(s) at the same predefined location(s), or alternatively, the location of the transparent part(s) is learned, as will be explained in more detail below. The modified portions of the displayed image, covered by the transparent portions of the information carrier, and the image(s) printed on the non-transparent portions of the information carrier together form a scene or effect for the user. The modified portions of the displayed image may, for example, locally enlarge the image, while the non-transparent portions of the information carrier may be printed with the housing of binoculars. The combined effect for the user would be that he/she uses binoculars, which enhances the augmented reality and user experience. 
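The mapping from a detected card location to the screen region behind a transparent zone can be sketched as follows. This is a minimal illustration and not part of the patented method: the coordinate convention (zone offsets measured from the card's top-left corner, in pixels) and all names are assumptions of the sketch.

```python
from dataclasses import dataclass

@dataclass
class TransparentZone:
    # Offset of the zone's centre from the card's top-left corner, in pixels,
    # plus the zone radius. These values would be predefined or learned.
    dx: float
    dy: float
    radius: float

def zone_center_on_screen(card_x, card_y, zone):
    """Map a predefined zone offset to absolute screen coordinates,
    given the detected top-left position of the card on the screen."""
    return (card_x + zone.dx, card_y + zone.dy)

def covers(card_x, card_y, zone, px, py):
    """True if screen point (px, py) currently lies behind the transparent
    zone, and therefore belongs to the image portion to be modified."""
    cx, cy = zone_center_on_screen(card_x, card_y, zone)
    return (px - cx) ** 2 + (py - cy) ** 2 <= zone.radius ** 2
```

Only the pixels for which `covers(...)` holds need to be redrawn with the zone-specific effect; the rest of the displayed image stays as it is.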
[12] Optionally, as defined by claim 2, the computer-implemented method for interacting with a touch screen device according to the present invention further comprises: - tracking the location of the information carrier as it is moved along the screen; and - modifying at least one portion of an image displayed on the screen, the at least one portion of the image being displayed on a portion of the surface of the screen currently covered by the at least one transparent zone of the information carrier that is moved across the screen. [13] A specific embodiment of the present invention thus continuously follows the location of the information carrier. In other words, such an embodiment is aware of the current location of the information carrier at any time. This knowledge and knowledge about the location of the transparent part(s) in the information carrier can then be used to further enrich the virtual interaction between the screen and the card or information carrier. The portions of the displayed image that are changed follow the current location of the transparent zone(s) of the information carrier, so that the movements of the information carrier over the screen determine which part(s) of the screen currently change(s). [14] Also optionally, as defined by claim 3, the computer-implemented method for interacting with a touch screen device according to the present invention further comprises: - identifying the information carrier; and - determining the location of the at least one transparent zone in the information carrier in response to identification of the information carrier. [15] If the information carrier is effectively unique, e.g. different cards each having a unique label or machine-readable integrated code, the information carrier can be identified by scanning, detecting or reading its unique label or code. 
On the basis of the identification of the card, the location of the transparent zone(s) can be deduced, for example by consulting a list or database. The combined knowledge about the location of the information carrier, which is permanently monitored according to the present invention, and the location of the transparent zone(s) in the information carrier makes it possible at any time to change the portions of the displayed image that are covered by the transparent zone(s). [16] Alternatively, as defined by claim 4, the computer-implemented method for interacting with a touch screen device according to the present invention further comprises: - identifying a type of information carrier; and - determining the location of the at least one transparent zone in the information carrier in response to identification of the type. [17] Different types of cards or information carriers can indeed be distributed for use with a specific embodiment of the present invention. Each type of card can have the transparent part(s) at (a) specific fixed location(s), but these locations may differ for different types of cards. For example, a "monocle" card may have a single circular transparent zone at a predetermined location in the card, and a "binocular" card may have two circular transparent zones at predetermined locations in the card. Identifying the type of card, e.g. by detecting a label or code attached to or integrated into the card, may then be sufficient to obtain knowledge about the location of the transparent zone(s) in the card. Again, thanks to the combined knowledge about the location of the information carrier, which is permanently monitored in accordance with the present invention, and the location of the transparent zone(s) in the information carrier, as determined by the type of card, the portions of the displayed image covered by the transparent zone(s) can be changed at any time. 
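By way of illustration only, the lookup of zone locations from an identified card type described in paragraph [17] could take the form of a simple registry; the type names and zone coordinates below are invented for this sketch and do not appear in the patent.

```python
# Hypothetical registry: card type -> list of transparent zones, each given
# as (dx, dy, radius) offsets from the card's top-left corner, in pixels.
ZONE_LAYOUTS = {
    "monocle": [(50, 40, 20)],
    "binocular": [(30, 40, 15), (70, 40, 15)],
}

def zones_for(card_type):
    """Look up the transparent-zone layout for an identified card type.

    Raises ValueError for an unrecognized type, since no zone locations
    can then be deduced."""
    try:
        return ZONE_LAYOUTS[card_type]
    except KeyError:
        raise ValueError(f"unknown card type: {card_type!r}")
```

For effectively unique cards (claim 3), the same structure would simply be keyed by the card's unique code instead of its type.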
[18] Further optionally, as defined by claim 5, the computer-implemented method for interacting with a touch screen device according to the present invention further comprises: - detecting an additional confirmation gesture on or near the touch screen. [19] The computer-implemented method according to the present invention can thus detect confirmation by the user, e.g. touching a specific zone on the information carrier or on the screen with a finger, stylus or other object. In the case that the transparent zone(s) in the information carrier serve(s), for example, as a virtual magnifying glass with which a small or hidden item in a displayed image can be searched for, the user can make a confirmation gesture when the searched-for item is found. The computer-implemented method according to the invention can then control the display of a new image, e.g. a home screen, a screen with the next level, a score, etc. [20] In an embodiment of the computer-implemented method for interacting with a touch screen device according to the invention, defined by claim 6, modifying at least a portion of an image comprises displaying information relating to quiz questions, answers to such quiz questions and/or scores obtained by answering such quiz questions. [21] Thus, the present invention can be used for advanced interaction during a quiz. The card or information carrier can determine which specific type of quiz is started. The location of the card on the touch screen typically remains unchanged during the quiz. The card is placed in a predetermined position on the touch screen. This can be achieved by a card whose dimensions correspond to the dimensions of the screen, e.g. in case the device is a smartphone, or by displaying markings indicating the position of the card on the screen, or by initially detecting the position of the card via various location-determining techniques as described above. 
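The additional confirmation gesture of paragraphs [18]-[19] might, under the assumptions of this sketch (two anchor touches holding the card plus one extra tap, and an invented confirm rectangle), be recognized like this:

```python
def is_confirmation(tap, anchor_touches, confirm_rect):
    """A tap counts as a confirmation gesture when it is distinct from the
    anchor touches holding the card and lands inside the confirm rectangle,
    given as (x0, y0, x1, y1) in screen coordinates."""
    if tap in anchor_touches:
        return False  # one of the fingers holding the card, not a confirmation
    x, y = tap
    x0, y0, x1, y1 = confirm_rect
    return x0 <= x <= x1 and y0 <= y <= y1
```

On a positive result, the application would then switch to a new image such as a score or next-level screen, as the paragraph above describes.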
Once the position of the card on the screen is known, the areas of the screen behind the transparent zone(s) of the card can be used to display quiz questions, possible answers to such quiz questions, scores obtained by answering such quiz questions, and various items such as still and moving images that are part of a quiz question or of the possible answers to such a quiz question. [22] In an alternative embodiment of the computer-implemented method for interacting with a touch screen device according to the invention, as defined by claim 7, modifying at least one portion of an image comprises magnifying a portion of information that is part of the image. [23] In this embodiment of the invention, the information carrier, or more precisely the transparent part(s) thereof, forms a virtual magnifying glass. In case the non-transparent parts of the information carrier are printed with the housing of a monocle or binoculars, the combined printed and displayed images create a new overall image of a monocle or binoculars. This allows the user, for example, to search and find information on a displayed image, with virtual help from a card, that is impossible or difficult to find with the naked eye. [24] In an alternative embodiment of the computer-implemented method for interacting with a touch screen device according to the present invention, as defined by claim 8, modifying at least one portion of an image includes displaying an item that is hidden in the image. [25] In this embodiment, the card or data carrier becomes a virtual tool for the user that reveals hidden items, e.g. an animal hidden behind leaves in a forest, as soon as the user moves the transparent zone of the card to the location in the displayed image where the item is hidden. 
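The hidden-item effect of paragraphs [24]-[25] amounts to a containment test: an item is drawn only while its position lies behind the transparent zone. A minimal sketch, in which the item names and coordinates are invented:

```python
def revealed_items(hidden_items, zone_center, radius):
    """Return the names of hidden items whose positions currently lie behind
    the circular transparent zone, and which should therefore be drawn.

    hidden_items maps item name -> (x, y) screen position."""
    cx, cy = zone_center
    return [name for name, (x, y) in hidden_items.items()
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2]
```

Calling this on every tracked movement of the card makes items appear and disappear as the zone crosses them.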
[26] In yet another alternative embodiment of the computer-implemented method for interacting with a touch screen device according to the present invention, as defined by claim 9, modifying at least one portion of an image comprises displaying a virtual X-ray scan of a portion of the image. [27] In this embodiment of the invention, the card or information carrier becomes a virtual X-ray camera that helps the user to visualize portions of a human or animal skeleton by moving a transparent portion of the card along the body part of which the user wants to see the X-ray image. It goes without saying that different alternatives are possible in which, instead of an X-ray filter, other types of filters are applied to the displayed image in the areas covered by a transparent part of the card. [28] In yet another embodiment of the computer-implemented method for interacting with a touch screen device according to the present invention, as defined by claim 10, modifying at least one portion of an image comprises displaying a virtual night vision view of a portion of the image. [29] Thus, a black or very dark image displayed on the touch screen can be scanned by means of a virtual night vision device, i.e. a card with transparent zones that locally changes the image to an infrared representation of the scene. [30] According to an optional aspect of the computer-implemented method for interacting with a touch screen device according to the present invention, as defined by claim 11, the at least one transparent zone of the information carrier can be colored. [31] Such a colored transparent part, e.g. realized via the integration of a colored, transparent film in the information carrier, can for instance enable the visualization of a specific image on the screen. 
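The X-ray and night-vision effects of claims 9 and 10 can both be modelled as a pixel filter applied only behind the transparent zone. The sketch below assumes a grayscale image stored as a list of rows; the actual filters and image representation are implementation choices, not part of the claims.

```python
def apply_zone_filter(image, zone_center, radius, pixel_filter):
    """Return a copy of a grayscale image (list of rows of pixel values) in
    which pixel_filter is applied only to pixels behind the circular
    transparent zone; all other pixels are left unchanged."""
    cx, cy = zone_center
    out = [row[:] for row in image]  # copy, so the source image is untouched
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                out[y][x] = pixel_filter(value)
    return out
```

An X-ray or night-vision look then only differs in the `pixel_filter` passed in, e.g. `lambda v: 255 - v` for a crude negative.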
In addition to a computer-implemented method for interacting with a touch screen device, the present invention also relates to a corresponding data processing system as defined by claim 12, comprising means for performing the method according to the invention. Furthermore, the present invention also relates to an associated computer program, as defined by claim 13, comprising software code adapted to implement the computer-implemented method according to the invention, and to a computer-readable storage medium as defined by claim 14, comprising such a computer program. [34] As defined by claim 15, the present invention also comprises an apparatus operable to cause a touch screen to display visual elements, the apparatus operable to: - detect the presence of an information carrier in superposition on the screen, wherein the information carrier has at least one transparent zone; - determine the location of the information carrier on the screen; and - modify at least one portion of an image displayed on the screen, the at least one portion of the image being displayed on a portion of the surface of the screen covered by the at least one transparent zone of the information carrier. [35] In an advantageous embodiment of the apparatus operable to have a touch screen display visual elements according to the invention, defined by claim 16, the apparatus is further operable to: - track the location of the information carrier when it is moved along the screen; and - modify at least one portion of an image displayed on the screen, the at least one portion of the image being displayed on a portion of the surface of the screen currently covered by the at least one transparent zone of the information carrier moved along the screen, the at least one portion of the image displayed behind the at least one transparent zone of the information carrier and an image printed on the information carrier together forming a combined image.

BRIEF DESCRIPTION OF THE DRAWINGS FIG. 
1A illustrates an information carrier in a first embodiment of the method according to the invention;
FIG. 1B illustrates a touch screen device in the first embodiment of the method according to the invention;
FIG. 1C illustrates the initial positioning of the information carrier on the touch screen device in the first embodiment of the method according to the invention;
FIG. 1D illustrates the movement of the information carrier over the touch screen device in the first embodiment of the method according to the invention;
FIG. 1E illustrates the execution of a confirmation gesture in the first embodiment of the method according to the invention;
FIG. 2A illustrates an information carrier in a second embodiment of the method according to the invention;
FIG. 2B illustrates a touch screen device in the second embodiment of the method according to the invention;
FIG. 2C illustrates the initial positioning of the information carrier on the touch screen device in the second embodiment of the method according to the invention;
FIG. 2D illustrates the movement of the information carrier over the touch screen device in the second embodiment of the method according to the invention; and
[45] FIG. 2E illustrates the execution of a confirmation gesture in the second embodiment of the method according to the invention.

Detailed Description of Embodiment(s)

[46] FIG. 1A shows a card 101, e.g. made of paper, cardboard or plastic. The card 101 has a circular portion 102 that has been made transparent and two smaller circular zones 103A and 103B that are marked. The latter marked zones 103A and 103B are intended for finger contact as soon as the card is on the touch screen of a device capable of running a software application that interacts with the card 101 according to the principles of the present invention. FIG. 1B shows a device 110 with a touch screen 111, i.e. a screen with a capacitive layer that responds to touch by objects such as a finger or stylus. 
A software application is running on the device 110 that displays an image of a tree on screen 111. In FIG. 1C, a person has placed card 101 on the touch screen 111 of the device 110. The software application running on device 110 detects the presence of card 101 and is able to identify the card 101. To make this possible, card 101 may have a unique integrated conductive pattern or may include instructions for the user to form a touch pattern that enables touch screen device 110, under the control of the software application, to recognize the card 101. Once the presence and identification of the card 101 are determined, the user 120 must touch the marked zones 103A and 103B with two fingers to allow the software application to determine the exact location of the card 101 on the touch screen 111. Knowledge about the location of the card 101 and identification of the card 101 is sufficient for the software application to be able to determine the location of the transparent circular portion 102. On the screen 111, the software application will modify the portion 112A of the image displayed in the circular zone 102 to display an element that was previously hidden. In the specific example illustrated by FIG. 1C, the software application controls the graphical user interface (GUI) of device 110 to display an image 112A of an owl sitting on the lower branch of the tree shown in FIG. 1B. In FIG. 1D, the user 120 has moved the card 101 along the touch screen 111 of the device 110 to a new position. While doing so, the user 120 has kept two fingers in contact with the marked zones 103A and 103B, respectively. This allows the capacitive layer of touch screen 111 to follow the movement of the card 101 and, in response, the software application running on device 110 and controlling the images being displayed can immediately change the portion of the displayed image behind the transparent circular zone 102. 
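The location step described above, two fingers on the marked zones 103A and 103B, can be sketched as a pose estimate: with known offsets of the marked zones on the card, two touch points determine the card's rotation and origin on the screen, and re-running the estimate on every touch-move event also yields the tracking of FIG. 1D. The marker offsets, names, and use of a rigid two-point fit are illustrative assumptions, not details taken from the patent.

```python
import math

def card_pose(touch_a, touch_b, marker_a, marker_b):
    """Estimate the card's rotation (radians) and the screen position of its
    top-left corner from the two anchor touches on the marked zones, whose
    offsets from the card's top-left corner are marker_a and marker_b."""
    # Rotation: angle of the marker axis on screen minus its angle on the card.
    ang_screen = math.atan2(touch_b[1] - touch_a[1], touch_b[0] - touch_a[0])
    ang_card = math.atan2(marker_b[1] - marker_a[1], marker_b[0] - marker_a[0])
    theta = ang_screen - ang_card
    # Origin: subtract the rotated marker_a offset from its touch position.
    ca, sa = math.cos(theta), math.sin(theta)
    ox = touch_a[0] - (marker_a[0] * ca - marker_a[1] * sa)
    oy = touch_a[1] - (marker_a[0] * sa + marker_a[1] * ca)
    return theta, (ox, oy)
```

Knowing the origin and rotation, the application can place the transparent zone's centre on the screen and redraw the image portion behind it on every movement of the anchor touches.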
Hidden elements are displayed when the transparent circular zone 102 crosses them. In the specific example of FIG. 1D, an image 112B of two hanging cherries is displayed when the transparent circular zone 102 of card 101 covers the upper branch of the tree shown in FIG. 1B. [50] The card 101 and the associated software application running on device 110 realize advanced interaction between the card and the screen. For example, card 101 may include instructions for the user to search for cherries in a tree. The card 101 becomes a search tool for the user since its transparent portion 102 allows the user to reveal elements hidden in the originally displayed image. As illustrated by FIG. 1E, the user can confirm that the cherries were found by means of an additional gesture, e.g. touching the screen with a finger of his second hand 130, so that points are collected, e.g. 5 additional points as shown by 113 in FIG. 1E. FIG. 2A depicts a card 201, e.g. made of paper, cardboard or plastic, which is used in a second embodiment of the present invention. The card 201 also has a circular portion 202 that is made transparent and two smaller circular zones 203A and 203B that are marked. The latter marked zones 203A and 203B are again intended for finger contact once the card 201 is on the touch screen of a device capable of running a software application that interacts with the card 201 according to the principles of the present invention. The card 201 is further printed with the image 204 of a magnifying glass, positioned so that the transparent circular zone 202 coincides with the glass of the printed magnifying glass 204. FIG. 2B shows a device 210 with a touch screen 211, i.e. a screen with a capacitive layer that responds to touch by objects such as a finger or stylus. A software application running on the device 210 displays an image on screen 211, the image comprising a first figure 212A and a second figure 213A. In FIG. 2C, a person has placed card 201 on touch screen 211 of device 210. 
The software application running on device 210 detects the presence of card 201 and is able to identify the card 201. To make this possible, card 201 may have a unique integrated conductive pattern or may include instructions for the user to form a touch pattern that enables touch screen device 210, under control of the software application, to recognize the card 201. Once the presence and identification of the card 201 have been determined, the user 220 must touch the marked zones 203A and 203B with two fingers to enable the software application to determine the exact location of the card 201 on the touch screen 211. Knowledge about the location of the card 201 and identification of the card 201 is sufficient for the software application to be able to determine the location of the transparent circular portion 202. On the screen 211, the software application will change the portion of the image displayed in the circular zone 202 to enlarge one or more elements displayed therein. In the specific example illustrated by FIG. 2C, the software application controls the graphical user interface (GUI) of device 210 to display an enlarged image 213B of the second figure 213A shown in FIG. 2B. In FIG. 2D, user 220 has moved card 201 along touch screen 211 of device 210 to a new position. While doing so, the user 220 has kept two fingers in contact with the marked zones 203A and 203B, respectively. This allows the capacitive layer of touch screen 211 to follow the movement of the card 201 and, in response, the software application running on device 210 and controlling the images being displayed can immediately change the portion of the displayed image behind the transparent circular zone 202. Elements in the image are magnified when the transparent circular zone 202 crosses them. In the specific example of FIG. 2D, an enlarged image 212B of the first figure is displayed as soon as the transparent circular zone 202 of card 201 covers the first figure 212A shown in FIG. 2B. 
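The local magnification behind zone 202 can be sketched as an inverse mapping: each screen pixel behind the zone samples the source image closer to the zone centre, so the content appears enlarged. This is one possible rendering only, with nearest-neighbour sampling and a grayscale list-of-rows image assumed for brevity.

```python
def magnify_region(image, zone_center, radius, factor=2):
    """Return a copy of a grayscale image (list of rows) in which the content
    behind the circular zone appears magnified by `factor`, using
    nearest-neighbour sampling toward the zone centre."""
    cx, cy = zone_center
    out = [row[:] for row in image]
    for y in range(len(image)):
        for x in range(len(image[0])):
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                # Sample the source pixel at 1/factor of the distance
                # from the zone centre.
                sx = int(round(cx + (x - cx) / factor))
                sy = int(round(cy + (y - cy) / factor))
                out[y][x] = image[sy][sx]
    return out
```

Because the sampled point always lies between the zone centre and the output pixel, the sampling stays inside the image whenever the zone does.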
The printed magnifying glass 204 and the magnified visualization of elements in portions of the screen 211 that are covered by the transparent zone 202 result in a combined new image for the user, i.e. a virtual magnifying glass. The card 201 and the associated software application running on device 210 realize advanced interaction between the card 201 and the screen 211. For example, the card 201 may include instructions for the user to search for a particular figure in a displayed image. The card 201 becomes a search tool for the user since its transparent portion 202 allows the user to magnify elements that are hardly or not visible or perceptible in the original displayed image. As illustrated by FIG. 2E, the user can confirm by means of an additional gesture, e.g. touching the screen with a finger of his second hand 230, that the figure was found, so that points are collected, e.g. 5 additional points as represented by 214 in FIG. 2E. The method according to the invention will typically be computer-implemented on a data processing system or computer. A data processing system or computer operated according to the present invention can be a workstation, server, laptop, desktop, portable device, mobile device, tablet or other computer device, as will be understood by those skilled in the art. [57] The data processing system or computer may include a bus or network for connectivity between different components, directly or indirectly, a memory or database, one or more processors, input/output ports, a power supply, etc. A person skilled in the art will recognize that the bus or network may comprise one or more buses, such as an address bus, data bus, or any combination thereof, or one or more network connections. A person skilled in the art will furthermore recognize that, depending on the intended applications and uses of a specific embodiment, several of these components can be implemented by a single device. 
Similarly, in some cases, a single component can be implemented by multiple devices. [58] The data processing system or computer may contain or interact with a variety of computer-readable media. For example, computer-readable media may include the following: Random Access Memory (RAM), Read Only Memory (ROM), Electronically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technologies, CD-ROM, Digital Versatile Disk (DVD) or other optical or holographic media, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices that can be used to encode information and that can be accessed by the data processing system or computer. [59] The memory may contain computer storage media in the form of volatile and/or non-volatile memory. The memory can be removable, non-removable or any combination thereof. Examples of hardware devices are devices such as hard disks, solid state memories, optical disk drives, and the like. The data processing system or the computer may comprise one or more processors that read data from components such as the memory, the various I/O components, etc. [60] The I/O ports can allow the data processing system or computer to be logically linked to other devices such as I/O components. Some of the I/O components may be built into the computer. Examples of such I/O components are a microphone, joystick, recording device, game pad, satellite dish, scanner, printer, wireless device, network device or the like. [61] Although the present invention has been illustrated with reference to specific embodiments, it will be apparent to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be practiced with various changes and modifications without departing from the scope of the invention. 
The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all modifications falling within the meaning and scope of the claims are therefore included herein. In other words, the invention is intended to cover all changes, variations or equivalents that fall within the scope of the underlying basic principles and whose essential attributes are claimed in this patent application. In addition, the reader of this patent application will understand that the words "comprising" or "include" do not exclude other elements or steps, that the word "a" does not exclude a plural, and that a single element, such as a computer system, a processor or another integrated unit, can fulfil the functions of different means mentioned in the claims. Any reference signs in the claims should not be construed as limiting the claims in question. The terms "first", "second", "third", "a", "b", "c" and the like, when used in the description or in the claims, are used to distinguish between similar elements or steps and do not necessarily describe a sequential or chronological order. Similarly, the terms "top", "bottom", "over", "under" and the like are used for the purposes of the description and do not necessarily refer to relative positions. It is to be understood that those terms are interchangeable under proper conditions and that embodiments of the invention are capable of functioning in accordance with the present invention in sequences or orientations other than those described or illustrated above.
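The method described above (identify the carrier, determine its location on the screen, then modify the image portion behind its transparent zone, e.g. by magnifying it) could be sketched as follows. This is purely an illustrative sketch, not part of the disclosure: the `ZONE_OFFSETS` registry, all names and the millimetre coordinates are hypothetical assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    x: float
    y: float
    w: float
    h: float

# Hypothetical lookup: once the carrier (or its type) has been identified,
# the offset of its transparent zone within the card is known.
ZONE_OFFSETS = {
    "magnifier-card": Rect(10.0, 10.0, 30.0, 30.0),  # relative to card origin, mm
}

def covered_region(card_origin, carrier_type):
    """Screen rectangle currently covered by the carrier's transparent zone."""
    zone = ZONE_OFFSETS[carrier_type]
    cx, cy = card_origin
    return Rect(cx + zone.x, cy + zone.y, zone.w, zone.h)

def magnified_viewport(region, factor):
    """Source rectangle whose content, enlarged into `region`, appears
    magnified by `factor` behind the transparent zone (virtual magnifying glass)."""
    sw, sh = region.w / factor, region.h / factor
    return Rect(region.x + (region.w - sw) / 2.0,
                region.y + (region.h - sh) / 2.0,
                sw, sh)
```

Re-evaluating `covered_region` whenever the reported card location changes would correspond to tracking the carrier as it is moved along the screen.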
Claims:
Claims (16) [1] A computer-implemented method for interacting with a device (110; 210) with a touch screen (111; 211), said computer-implemented method comprising: - detecting the presence of an information carrier (101; 201) in superposition on said screen (111; 211), wherein said information carrier (101; 201) has at least one transparent zone (102; 202); - determining the location of said information carrier (101; 201) on said screen (111; 211); and - modifying at least one portion of an image (112A, 112B; 212B, 213B) displayed on said screen (111; 211), wherein said at least one portion of said image (112A, 112B; 212B, 213B) is displayed on a portion of the surface of said screen (111; 211) which is covered by said at least one transparent zone (102; 202) of said information carrier (101; 201), and wherein the at least one portion of said image (112A, 112B; 212B, 213B) displayed behind said at least one transparent zone (102; 202) of said information carrier (101; 201) and an image (204) printed on said information carrier together form a combined image. [2] A computer-implemented method for interacting with a device (110; 210) with a touch screen (111; 211) according to claim 1, said computer-implemented method further comprising: - tracking the location of said information carrier (101; 201) when it is moved along said screen (111; 211); and - modifying at least one portion of an image (112A, 112B; 212B, 213B) displayed on said screen (111; 211), wherein said at least one portion of said image (112A, 112B; 212B, 213B) is displayed on a portion of the surface of said screen (111; 211) that is currently covered by said at least one transparent zone (102; 202) of said information carrier (101; 201) as it is moved along said screen (111; 211). 
[3] A computer-implemented method for interacting with a device (110; 210) with a touch screen (111; 211) according to claim 1 or claim 2, said computer-implemented method further comprising: - identifying said information carrier (101; 201); and - determining the location of said at least one transparent zone (102; 202) in said information carrier (101; 201) in response to identification of said information carrier (101; 201). [4] A computer-implemented method for interacting with a device (110; 210) with a touch screen (111; 211) according to claim 1 or claim 2, said computer-implemented method further comprising: - identifying a type of said information carrier (101; 201); and - determining the location of said at least one transparent zone (102; 202) in said information carrier (101; 201) in response to identification of said type. [5] A computer-implemented method for interacting with a device (110; 210) with a touch screen (111; 211) according to claim 1 or claim 2, said computer-implemented method further comprising: - detecting an additional acknowledgment gesture (130; 230) on or near said touch screen (111; 211). [6] A computer-implemented method for interacting with a touch screen device according to claim 1, wherein modifying at least a portion of an image comprises displaying information regarding quiz questions, answers to such quiz questions and/or scores that are obtained by answering such quiz questions. [7] A computer-implemented method for interacting with a device (210) with a touch screen (211) according to claim 1 or claim 2, wherein modifying at least one portion of an image (212B, 213B) comprises magnifying a portion of information (212A, 213A) that forms part of said image. [8] A computer-implemented method for interacting with a device (110) with a touch screen (111) according to claim 1 or claim 2, wherein modifying at least one portion of an image (112A, 112B) comprises displaying an item that is hidden in said image. 
[9] A computer-implemented method for interacting with a touch screen device according to claim 1 or claim 2, wherein modifying at least one portion of an image comprises displaying a virtual X-ray scan of a portion of said image. [10] A computer-implemented method for interacting with a touch screen device according to claim 1 or claim 2, wherein modifying at least one portion of an image comprises displaying a virtual night vision view of a portion of said image. [11] A computer-implemented method for interacting with a touch screen device according to claim 1 or claim 2, wherein said at least one transparent zone (102; 202) of said information carrier (101; 201) is colored. [12] A data processing system comprising means for performing the computer-implemented method according to any one of claims 1 to 11. [13] A computer program comprising software code adapted to perform the computer-implemented method according to any one of claims 1 to 11. [14] A computer-readable medium comprising the computer program of claim 13. 
[15] A device operative to cause a touch screen (111; 211) to display visual elements, said device being operable to: - detect the presence of an information carrier (101; 201) in superposition on said screen (111; 211), wherein said information carrier (101; 201) has at least one transparent zone (102; 202); - determine the location of said information carrier (101; 201) on said screen (111; 211); and - modify at least one portion of an image (112A, 112B; 212B, 213B) displayed on said screen (111; 211), wherein said at least one portion of said image (112A, 112B; 212B, 213B) is displayed on a portion of the surface of said screen (111; 211) that is covered by said at least one transparent zone (102; 202) of said information carrier (101; 201), and wherein said at least one portion of said image (112A, 112B; 212B, 213B) displayed behind said at least one transparent zone (102; 202) of said information carrier (101; 201) and an image (204) printed on said information carrier together form a combined image. [16] A device operable to cause a touch screen (111; 211) to display visual elements as defined by claim 15, wherein said device is further operable to: - track the location of said information carrier (101; 201) when it is moved along said screen (111; 211); and - modify at least one portion of an image (112A, 112B; 212B, 213B) displayed on said screen (111; 211), wherein said at least one portion of said image (112A, 112B; 212B, 213B) is displayed on a portion of the surface of said screen (111; 211) currently covered by said at least one transparent zone (102; 202) of said information carrier (101; 201) as it is moved along said screen (111; 211).
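The confirmation gesture of claim 5, together with the scoring behaviour described for FIG. 2E (an extra touch confirming that the sought figure was found, rewarded with e.g. 5 points), could be sketched as follows. The rectangle coordinates, the figure position and the 5-point reward are hypothetical assumptions; the claims do not prescribe any implementation.

```python
def figure_found(zone, figure_pos):
    """True if the sought figure lies within the screen rectangle
    `zone` = (x, y, w, h) covered by the card's transparent zone."""
    zx, zy, zw, zh = zone
    fx, fy = figure_pos
    return zx <= fx <= zx + zw and zy <= fy <= zy + zh

def confirm_find(score, zone, figure_pos, tap_detected, points=5):
    """Add `points` to `score` when the user gives the additional
    acknowledgment gesture (claim 5) while the figure sits behind the
    transparent zone, as with the 5 points shown by reference 214 in FIG. 2E."""
    if tap_detected and figure_found(zone, figure_pos):
        return score + points
    return score
```

In a real application the tap would come from the touch-screen event stream and the zone rectangle from the carrier-tracking step; here both are simply passed in as parameters.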
Similar technologies:
Publication number | Publication date | Patent title
US9939914B2|2018-04-10|System and method for combining three-dimensional tracking with a three-dimensional display for a user interface
US20210365159A1|2021-11-25|Mobile device interfaces
US11087550B2|2021-08-10|Wearable electronic glasses with eye tracking
US10001845B2|2018-06-19|3D silhouette sensing system
US10943403B2|2021-03-09|Object preview in a mixed reality environment
RU2654145C2|2018-05-16|Information search method and device and computer readable recording medium thereof
BE1022308B1|2016-03-15|A METHOD OF INTERACTION BETWEEN A DEVICE AND AN INFORMATION CARRIER WITH A TRANSPARENT ZONE(S)
JP2015516624A|2015-06-11|Method for emphasizing effective interface elements
CN107408100A|2017-11-28|Sight is used for automatic page turning
US9639167B2|2017-05-02|Control method of electronic apparatus having non-contact gesture sensitive region
US11243678B2|2022-02-08|Method of panning image
EP2960769A1|2015-12-30|Method for providing data input using a tangible user interface
US10114540B2|2018-10-30|Interactive user interface including layered sub-pages
Yang2015|Variable reality: interacting with the virtual book
JP5895658B2|2016-03-30|Display control apparatus and display control method
CN108885638A|2018-11-23|Inking input for digital maps
2013|Design of Tracking and Interaction Techniques for Touch-sensitive Tangibles in Tabletop Environments
Patent family:
Publication number | Publication date
US20160062482A1|2016-03-03|
EP2796977A1|2014-10-29|
WO2014173549A1|2014-10-30|
Cited references:
Publication number | Application date | Publication date | Applicant | Patent title
US20100302144A1|2009-05-28|2010-12-02|Microsoft Corporation|Creating a virtual mouse input device|
US20050134578A1|2001-07-13|2005-06-23|Universal Electronics Inc.|System and methods for interacting with a control environment|
US7394459B2|2004-04-29|2008-07-01|Microsoft Corporation|Interaction between objects and a virtual environment display|
US8381135B2|2004-07-30|2013-02-19|Apple Inc.|Proximity detector in handheld device|
US7576725B2|2004-10-19|2009-08-18|Microsoft Corporation|Using clear-coded, see-through objects to manipulate virtual objects|
EP1922602A2|2005-08-11|2008-05-21|N-trig Ltd.|Apparatus for object information detection and methods of using same|
US7993201B2|2006-02-09|2011-08-09|Disney Enterprises, Inc.|Electronic game with overlay card|
WO2009143086A2|2008-05-17|2009-11-26|Qwizdom, Inc.|Digitizing tablet devices, methods and systems|
US8421761B2|2009-08-26|2013-04-16|General Electric Company|Imaging multi-modality touch pad interface systems, methods, articles of manufacture, and apparatus|
US20110095994A1|2009-10-26|2011-04-28|Immersion Corporation|Systems And Methods For Using Static Surface Features On A Touch-Screen For Tactile Feedback|
US9235233B2|2010-10-01|2016-01-12|Z124|Keyboard dismissed on closure of device|
CN102830903A|2012-06-29|2012-12-19|鸿富锦精密工业(深圳)有限公司|Electronic equipment and memorandum adding method of electronic equipment|
US9971495B2|2013-01-28|2018-05-15|Nook Digital, Llc|Context based gesture delineation for user interaction in eyes-free mode|
US20140300555A1|2013-04-05|2014-10-09|Honeywell International Inc.|Avionic touchscreen control systems and program products having "no look" control selection feature|
JP5925347B1|2015-02-26|2016-05-25|株式会社Cygames|Information processing system and program, server, terminal, and medium|
US10386940B2|2015-10-30|2019-08-20|Microsoft Technology Licensing, Llc|Touch sensing of user input device|
FR3064086A1|2017-03-14|2018-09-21|Orange|PRESSURE BUTTON FOR TOUCH SURFACE, PHYSICAL INTERFACE AND PROTECTION|
US11194464B1|2017-11-30|2021-12-07|Amazon Technologies, Inc.|Display control using objects|
Legal status:
Priority:
Application number | Application date | Patent title
EP131651119| EP20130165111|EP2796977A1|2013-04-24|2013-04-24|A method for interfacing between a device and information carrier with transparent area|
WO131651119| PCT/EP2014/051579|WO2014173549A1|2013-04-24|2014-01-28|A method for interfacing between a device and information carrier with transparent area|